An Information Theoretic Perspective on Conformal Prediction

Neural Information Processing Systems

More precisely, we prove three different ways to upper bound the intrinsic uncertainty, as measured by the conditional entropy of the target variable given the inputs, by combining conformal prediction (CP) with information-theoretic inequalities.
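The conformal prediction machinery the abstract builds on can be illustrated with a minimal split-conformal sketch. This is not the paper's bound, only the standard CP construction it combines with information-theoretic inequalities: larger prediction sets signal higher uncertainty in the target given the inputs. The toy score generator and all variable names below are illustrative assumptions, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_scores(n, k=3):
    """Stand-in for a trained classifier's predicted class probabilities."""
    logits = rng.normal(size=(n, k))
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

n_cal, n_test, k = 500, 200, 3
cal_probs, test_probs = toy_scores(n_cal, k), toy_scores(n_test, k)
cal_labels = rng.integers(0, k, size=n_cal)

# Split conformal prediction: nonconformity score = 1 - p(true class),
# computed on a held-out calibration set.
alpha = 0.1
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Finite-sample-corrected quantile guaranteeing >= 1 - alpha marginal coverage.
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)

# Prediction set: every class whose nonconformity score is within the threshold.
pred_sets = (1.0 - test_probs) <= q
avg_size = pred_sets.sum(axis=1).mean()
```

The average prediction-set size `avg_size` is the quantity one would then relate to the conditional entropy: intuitively, the log of the set size needed for coverage cannot be much smaller than the intrinsic uncertainty of the label.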








Credal Learning Theory

Neural Information Processing Systems

Statistical learning theory is the foundation of machine learning: it provides theoretical bounds on the risk of models learned from a (single) training set, assumed to be drawn from an unknown probability distribution.
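The classical single-distribution guarantee that this abstract takes as its starting point can be made concrete with a standard Hoeffding-style generalization bound for a fixed hypothesis with bounded loss. This is textbook material, not the credal bounds the paper itself develops; the function name and example numbers are illustrative.

```python
import math

def hoeffding_bound(emp_risk, n, delta):
    """One-sided Hoeffding generalization bound for a fixed hypothesis with
    loss in [0, 1]: with probability >= 1 - delta over the draw of the n
    i.i.d. training samples,
        true risk <= empirical risk + sqrt(log(1/delta) / (2n)).
    """
    return emp_risk + math.sqrt(math.log(1.0 / delta) / (2.0 * n))

# Example: 5% empirical error on 10,000 i.i.d. samples, at 95% confidence.
bound = hoeffding_bound(0.05, 10_000, 0.05)  # ~0.0622
```

The bound's dependence on a single unknown distribution is exactly the assumption that credal learning theory relaxes, by working with sets of distributions instead.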